Conceptual modelling: Towards detecting modelling errors in engineering applications
Rapid advancements of modern technologies put high demands on the mathematical modelling of engineering systems. Typically, systems are no longer “simple” objects, but rather coupled systems involving multiphysics phenomena, the modelling of which involves coupling of models that describe the different phenomena. After constructing a mathematical model, it is essential to analyse the correctness of the coupled models and to detect modelling errors compromising the final modelling result. Broadly, there are two classes of modelling errors: (a) errors related to abstract modelling, e.g., conceptual errors concerning the coherence of a model as a whole, and (b) errors related to concrete modelling or instance modelling, e.g., questions of approximation quality and implementation. Instance modelling errors, on the one hand, are relatively well understood. Abstract modelling errors, on the other, are not appropriately addressed by modern modelling methodologies. The aim of this paper is to initiate a discussion on abstract approaches and their usability for the mathematical modelling of engineering systems, with the goal of making it possible to catch conceptual modelling errors early and automatically by computer-assisted tools. To that end, we argue that it is necessary to identify and employ suitable mathematical abstractions to capture an accurate conceptual description of the process of modelling engineering systems.
Synthesis of Recursive ADT Transformations from Reusable Templates
Recent work has proposed a promising approach to improving scalability of
program synthesis by allowing the user to supply a syntactic template that
constrains the space of potential programs. Unfortunately, creating templates
often requires nontrivial effort from the user, which impedes the usability of
the synthesizer. We present a solution to this problem in the context of
recursive transformations on algebraic data-types. Our approach relies on
polymorphic synthesis constructs: a small but powerful extension to the
language of syntactic templates, which makes it possible to define a program
space in a concise and highly reusable manner, while at the same time retaining
the scalability benefits of conventional templates. This approach enables
end-users to reuse predefined templates from a library for a wide variety of
problems with little effort. The paper also describes a novel optimization that
further improves the performance and scalability of the system. We evaluated
the approach on a set of benchmarks that most notably includes desugaring
functions for lambda calculus, which force the synthesizer to discover Church
encodings for pairs and boolean operations.
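To make the benchmark concrete: the Church encodings for pairs and booleans that the synthesizer must discover are the standard lambda-calculus ones, which can be written down directly. The following is a minimal Python sketch of those encodings, not the synthesizer's output or implementation; all names are illustrative.

```python
# Standard Church encodings of pairs and booleans as plain functions.
pair = lambda a: lambda b: lambda f: f(a)(b)   # pack two values
fst  = lambda p: p(lambda a: lambda b: a)      # project the first component
snd  = lambda p: p(lambda a: lambda b: b)      # project the second component

true  = lambda t: lambda f: t                  # select the first branch
false = lambda t: lambda f: f                  # select the second branch
neg   = lambda b: lambda t: lambda f: b(f)(t)  # boolean negation

p = pair(1)(2)
print(fst(p), snd(p))             # 1 2
print(neg(true)("yes")("no"))     # no
```

A pair here is just a function waiting for a projection, so "desugaring" pairs into a pure lambda calculus amounts to synthesizing exactly these closures.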
Existential Types for Relaxed Noninterference
Information-flow security type systems ensure confidentiality by enforcing
noninterference: a program cannot leak private data to public channels.
However, in practice, programs need to selectively declassify information about
private data. Several approaches have provided a notion of relaxed
noninterference supporting selective and expressive declassification while
retaining a formal security property. The labels-as-functions approach provides
relaxed noninterference by means of declassification policies expressed as
functions. The labels-as-types approach expresses declassification policies
using type abstraction and faceted types, a pair of types representing the
secret and public facets of values. The original proposal of labels-as-types is
formulated in an object-oriented setting where type abstraction is realized by
subtyping. The object-oriented approach, however, suffers from limitations due to
its receiver-centric paradigm.
In this work, we consider an alternative approach to labels-as-types,
applicable in non-object-oriented languages, which allows us to express
advanced declassification policies, such as extrinsic policies, based on a
different form of type abstraction: existential types. An existential type
exposes abstract types and operations on these; we leverage this abstraction
mechanism to express secrets that can be declassified using the provided
operations. We formalize the approach in a core functional calculus with
existential types, define existential relaxed noninterference, and prove that
well-typed programs satisfy this form of type-based relaxed noninterference.
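The core idea of a declassification policy as "a secret packaged with the only operations allowed to observe it" can be mimicked dynamically. The sketch below is a loose Python analogue under the assumption that hiding a value behind a closure stands in for existential abstraction; Python has no existential types, so this only illustrates the idea, and all names are invented.

```python
# A secret is sealed together with the sole operation permitted to observe
# it (the declassification policy). Callers can apply the policy but can
# never reach the raw secret itself.

def seal(secret, declassify_op):
    """Package a secret with its only allowed declassification operation."""
    def apply_policy(*args):
        return declassify_op(secret, *args)
    return apply_policy

# Policy: a stored password may be declassified only via an equality check.
check = seal("hunter2", lambda pw, guess: pw == guess)

print(check("hunter2"))   # True  -- the declassified (public) result
print(check("letmein"))   # False
```

The static, type-based version in the paper gets the same guarantee at compile time: the abstract type prevents any observation of the secret other than through the exported operations.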
Solving Tree Problems with Category Theory
Artificial Intelligence (AI) has long pursued models, theories, and
techniques to imbue machines with human-like general intelligence. Yet even the
currently predominant data-driven approaches in AI seem to be lacking humans'
unique ability to solve wide ranges of problems. This situation raises the
question of whether there exist principles that underlie general problem-solving
capabilities. We approach this question through the mathematical formulation of
analogies across different problems and solutions. We focus in particular on
problems that could be represented as tree-like structures. Most importantly,
we adopt a category-theoretic approach in formalising tree problems as
categories, and in proving the existence of equivalences across apparently
unrelated problem domains. We prove the existence of a functor between the
category of tree problems and the category of solutions. We also provide a
weaker version of the functor by quantifying equivalences of problem categories
using a metric on tree problems. Comment: 10 pages, 4 figures, International Conference on Artificial General Intelligence (AGI) 201
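The functorial viewpoint above requires structure-preserving maps between tree-shaped data. As a minimal illustration of functoriality on trees (not the paper's actual construction; the tree encoding and function names are assumptions for the sketch), a map over node labels preserves the tree's shape and respects composition:

```python
# A tree is (label, children); `tmap` relabels nodes while preserving the
# tree's structure -- the action of a functor on tree-shaped objects.

def tmap(f, tree):
    label, children = tree
    return (f(label), [tmap(f, c) for c in children])

t = (1, [(2, []), (3, [(4, [])])])

f = lambda x: x + 1
g = lambda x: x * 10

# Functor law: mapping g after f equals mapping their composition.
assert tmap(g, tmap(f, t)) == tmap(lambda x: g(f(x)), t)

print(tmap(f, t))  # (2, [(3, []), (4, [(5, [])])])
```

Proving that a problem-to-solution translation satisfies laws like this one is what licenses transferring a solution across apparently unrelated problem domains.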
Yukawa unification in SO(10) with light sparticle spectrum
We investigate a supersymmetric SO(10) GUT model with \mu<0. The requirements
of top-bottom-tau Yukawa unification, correct radiative electroweak symmetry
breaking and agreement with the present experimental data may be met when the
soft masses of scalars and gauginos are non-universal. We show how appropriate
non-universalities can easily be obtained in the SO(10) GUT broken to the
Standard Model. We discuss how values of BR(b-->s \gamma) and (g-2)_\mu
simultaneously in good agreement with the experimental data can be achieved
in the SO(10) model with \mu<0. In the region of the parameter space preferred by
our analysis there are two main mechanisms leading to the LSP relic abundance
consistent with the WMAP results. One is the co-annihilation with the stau and
the second is the resonant annihilation via exchange of the Z boson or the
light Higgs scalar. A very interesting feature of SO(10) models with negative
\mu is that they predict relatively light sparticle spectra. Even the heaviest
superpartners may easily have masses below 1.5 TeV in contrast to multi-TeV
particles typical for models with positive \mu. Comment: 23 pages, 5 figures
SUSY parameter determination at the LHC using cross sections and kinematic edges
We study the determination of supersymmetric parameters at the LHC from a
global fit including cross sections and edges of kinematic distributions. For
illustration, we focus on a minimal supergravity scenario and discuss how well
it can be constrained at the LHC operating at 7 and 14 TeV collision energy,
respectively. We find that the inclusion of cross sections greatly improves the
accuracy of the SUSY parameter determination, and allows model parameters to be
reliably extracted even in the initial phase of LHC data taking with 7 TeV
collision energy and 1/fb integrated luminosity. Moreover, cross section
information may be essential to study more general scenarios, such as those
with non-universal gaugino masses, and distinguish them from minimal,
universal, models. Comment: 22 pages, 8 figures
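The mechanics of such a global fit can be shown in miniature: each observable (a cross section, a kinematic edge position) contributes a pull to a chi-square, which is minimized over the model parameters. The toy below assumes a single parameter and invented numbers purely for illustration; a real SUSY fit scans many parameters with full theory predictions and correlated uncertainties.

```python
# Toy global chi-square fit combining two kinds of observables, both taken
# here to depend on one model parameter m (all values are invented).

def chi2(m, observables):
    """Sum of squared pulls: ((prediction - measurement) / error)^2."""
    return sum(((pred(m) - meas) / err) ** 2
               for pred, meas, err in observables)

observables = [
    (lambda m: 100.0 / m, 0.50, 0.05),   # toy cross section ~ 1/m
    (lambda m: 0.9 * m, 180.0, 5.0),     # toy kinematic-edge position ~ m
]

# Brute-force grid scan over the parameter.
grid = [m / 10 for m in range(1000, 3000)]
best = min(grid, key=lambda m: chi2(m, observables))
print(best)  # 200.0 -- both toy observables agree at this point
```

The qualitative point from the abstract survives even in the toy: a single class of observables can leave flat directions, while combining cross sections with edges pins the parameter down.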
Superpartner spectrum of minimal gaugino-gauge mediation
We evaluate the sparticle mass spectrum in the minimal four-dimensional
construction that interpolates between gaugino and ordinary gauge mediation at
the weak scale. We find that even in the hybrid case -- when the messenger
scale is comparable to the mass of the additional gauge particles -- both the
right-handed as well as the left-handed sleptons are lighter than the bino in
the low-scale mediation regime. This implies a chain of lepton production and,
consequently, striking signatures that may be probed at the LHC already in the
near future. Comment: 8 pages, 3 figures; V2: refs and a few comments added; V3: title change
Needle & Knot: Binder Boilerplate Tied Up
To lighten the burden of programming language mechanization, many approaches have been developed that tackle the substantial boilerplate which arises from variable binders. Unfortunately, the existing approaches are limited in scope. They typically do not support complex binding forms (such as multi-binders) that arise in more advanced languages, or they do not tackle the boilerplate due to mentioning variables and binders in relations. As a consequence, the human mechanizer is still unnecessarily burdened with binder boilerplate and discouraged from taking on richer languages.
This paper presents Knot, a new approach that substantially extends the support for binder boilerplate. Knot is a highly expressive language for natural and concise specification of syntax with binders. Its meta-theory constructively guarantees the coverage of a considerable amount of binder boilerplate for well-formed specifications, including that for well-scoping of terms and context lookups. Knot also comes with a code generator, Needle, that specializes the generic boilerplate for convenient embedding in Coq and provides a tactic library for automatically discharging proof obligations that frequently come up in proofs of weakening and substitution lemmas of type systems.
Our evaluation shows that Needle & Knot significantly reduce the size of language mechanizations (by 40% in our case study). Moreover, as far as we know, Knot enables the most concise mechanization of the POPLmark Challenge (1a + 2a) and is two-thirds the size of the next smallest. Finally, Knot allows us to mechanize, for instance, dependently-typed languages, which is notoriously challenging because of dependent contexts and mutually-recursive sorts with variables.
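The flavour of binder boilerplate that Knot targets shows up in even the smallest mechanization. The sketch below hand-writes de Bruijn shifting and substitution for the untyped lambda calculus in Python; this is an illustrative analogue of the kind of definitions (and attendant weakening/substitution lemmas) that Needle generates, not Needle's actual output.

```python
# Untyped lambda terms with de Bruijn indices, encoded as tuples:
#   ("var", i)      -- variable with index i
#   ("lam", body)   -- abstraction
#   ("app", f, a)   -- application

def shift(d, cutoff, t):
    """Add d to every free variable index >= cutoff."""
    kind = t[0]
    if kind == "var":
        i = t[1]
        return ("var", i + d if i >= cutoff else i)
    if kind == "lam":
        return ("lam", shift(d, cutoff + 1, t[1]))      # enter a binder
    return ("app", shift(d, cutoff, t[1]), shift(d, cutoff, t[2]))

def subst(j, s, t):
    """Substitute s for variable j in t, adjusting indices under binders."""
    kind = t[0]
    if kind == "var":
        return s if t[1] == j else t
    if kind == "lam":
        return ("lam", subst(j + 1, shift(1, 0, s), t[1]))
    return ("app", subst(j, s, t[1]), subst(j, s, t[2]))

# Substituting into (\x. x) leaves the bound variable untouched.
print(subst(0, ("var", 7), ("lam", ("var", 0))))  # ('lam', ('var', 0))
```

Every new sort with binders in a language forces another copy of this pattern, plus proofs that the operations commute correctly; that repetition is precisely what a specification language like Knot factors out.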